Upper Bounds on the Error of Sparse Vector and Low-Rank Matrix Recovery

Authors

  • Mohammadreza Malek-Mohammadi
  • Cristian R. Rojas
  • Magnus Jansson
  • Massoud Babaie-Zadeh
Abstract

Suppose that a solution x̃ to an underdetermined linear system b = Ax is given. x̃ is approximately sparse, meaning that it has a few large components while its remaining entries are small. However, the total number of nonzero components of x̃ is large enough to violate any condition for the uniqueness of the sparsest solution. On the other hand, if only the dominant components are considered, then the uniqueness conditions are satisfied. One intuitively expects that x̃ should not be far from the true sparse solution x0. We show that this intuition is indeed correct by providing an upper bound on ‖x̃ − x0‖ which is a function of the magnitudes of the small components of x̃ but independent of x0. This result is extended to the case in which b is perturbed by noise. Additionally, we generalize the upper bounds to the low-rank matrix recovery problem.
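
To make the setup concrete, here is a minimal numerical sketch (an illustration only; the dimensions, the Gaussian matrix A, and the null-space perturbation are assumptions of the sketch, and it does not compute the bound derived in the paper). It builds an underdetermined system b = Ax, a true sparse solution x0, and an approximately sparse exact solution x̃, then reports the recovery error next to the magnitudes of the small components that the bound is stated in terms of.

```python
# Illustrative sketch of the abstract's setup; not the paper's bound or algorithm.
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 40, 100, 5                       # measurements, ambient dimension, sparsity of x0

A = rng.standard_normal((m, n))            # underdetermined system matrix
x0 = np.zeros(n)
x0[rng.choice(n, k, replace=False)] = rng.standard_normal(k)   # true sparse solution
b = A @ x0

# Perturb x0 inside the null space of A so that x_tilde still solves b = A x
# exactly but has many small nonzero entries (i.e., it is only approximately sparse).
_, _, Vt = np.linalg.svd(A)
null_basis = Vt[m:].T                      # columns span the null space of A
x_tilde = x0 + 1e-2 * (null_basis @ rng.standard_normal(n - m))

assert np.allclose(A @ x_tilde, b)

# The paper bounds ||x_tilde - x0|| using only the magnitudes of the small
# components of x_tilde; here we simply report both quantities side by side.
tail = np.sort(np.abs(x_tilde))[:-k]       # the n - k smallest magnitudes of x_tilde
print("||x_tilde - x0||_2:", np.linalg.norm(x_tilde - x0))
print("l2 norm of the small components of x_tilde:", np.linalg.norm(tail))
```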

Similar articles

A Sharp Sufficient Condition for Sparsity Pattern Recovery

The sufficient number of linear and noisy measurements for exact and approximate sparsity pattern/support set recovery in the high-dimensional setting is derived. Although this problem has been addressed in the recent literature, there are still considerable gaps between those results and the exact limits of perfect support set recovery. To reduce this gap, in this paper, the sufficient con...

Low-Rank and Sparse Modeling of High-dimensional Vector Autoregressions

Network modeling of high-dimensional time series in the presence of unobserved latent variables is an important problem in macroeconomics and finance. In macroeconomic policy making and forecasting, it is often impossible to observe and incorporate all the relevant series in the analysis. Failure to include these variables often results in spurious connectivity among the observed time series in str...

Noisy matrix decomposition via convex relaxation: Optimal rates in high dimensions

We analyze a class of estimators based on convex relaxation for solving high-dimensional matrix decomposition problems. The observations are noisy realizations of a linear transformation X of the sum of an (approximately) low rank matrix Θ⋆ with a second matrix Γ⋆ endowed with a complementary form of low-dimensional structure; this set-up includes many statistical models of interest, including ...

Sufficient Conditions for Low-rank Matrix Recovery, Translated from Sparse Signal Recovery

Low-rank matrix recovery (LMR) is a rank minimization problem subject to linear equality constraints, and it arises in many fields such as signal and image processing, statistics, computer vision, system identification, and control. This class of optimization problems is NP-hard, and a popular approach replaces the rank function with the nuclear norm of the matrix variable. In this paper, we ...
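
As a hedged sketch of the nuclear-norm relaxation this summary refers to (not the cited paper's own method; the matrix size, rank, sampling mask, and the choice of the cvxpy modeling library are assumptions of the example), the convex surrogate minimizes the nuclear norm of X subject to linear equality constraints, here agreement with a set of observed entries:

```python
# Illustrative nuclear-norm relaxation of LMR; sizes and sampling are assumptions.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(1)
n, r = 20, 2
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))  # rank-r ground truth
mask = (rng.random((n, n)) < 0.5).astype(float)                # observed-entry indicator

X = cp.Variable((n, n))
objective = cp.Minimize(cp.normNuc(X))                         # nuclear norm as a surrogate for rank
constraints = [cp.multiply(mask, X) == mask * M]               # linear equality constraints on observed entries
cp.Problem(objective, constraints).solve()

print("relative recovery error:", np.linalg.norm(X.value - M) / np.linalg.norm(M))
```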

Sparse and Low Rank Recovery

Compressive sensing (sparse recovery) is a new area in mathematical image and signal processing that predicts that sparse signals can be recovered from what was previously believed to be highly incomplete measurements [3, 5, 7, 12]. Recently, the ideas of this field have been extended to the recovery of low-rank matrices from undersampled information [6, 8]; most notably to the matrix completion...
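
A minimal basis-pursuit sketch of the sparse-recovery claim summarized above (the problem sizes, Gaussian measurement matrix, and use of the cvxpy library are illustrative assumptions, not details of the cited work):

```python
# Illustrative l1-minimization (basis pursuit) recovery from incomplete measurements.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(2)
m, n, k = 40, 120, 5                       # measurements, dimension, sparsity

A = rng.standard_normal((m, n))            # random measurement matrix, m << n
x0 = np.zeros(n)
x0[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
b = A @ x0                                 # incomplete linear measurements of x0

x = cp.Variable(n)
problem = cp.Problem(cp.Minimize(cp.norm1(x)), [A @ x == b])   # basis pursuit
problem.solve()

print("recovery error:", np.linalg.norm(x.value - x0))
```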

Journal:
  • Signal Processing

Volume: 120   Issue:

Pages: -

Publication date: 2016